Second-order asymptotics in channel coding

Author

  • Masahito Hayashi
Abstract

Based on the channel coding theorem, there exists a sequence of codes for a given channel W such that the average error probability goes to 0 when the transmission rate R is less than the capacity C_W. That is, if the number n of applications of the channel W is sufficiently large, the average error probability of a good code goes to 0. In order to evaluate the average error probability with finite n, we often use the exponential rate of decrease, which depends on the transmission rate R. However, such an exponential evaluation ignores the constant factor. Therefore, it is not clear whether the exponential evaluation provides a good evaluation of the average error probability when the transmission rate R is close to the capacity. In fact, many researchers believe that, among the known evaluations, the Gallager bound [1] gives the best upper bound on the average error probability in channel coding when the transmission rate is greater than the critical rate. This is because the Gallager bound provides the optimal exponential rate of decrease. In order to clarify this point, we introduce the second-order asymptotics of channel coding, in which we describe the transmission length by C_W n + R_2 √n. From a practical viewpoint, when the coding length is close to C_W n, the second-order asymptotics gives a better evaluation of the average error probability than the first-order asymptotics. In fact, the second-order asymptotics has been applied to the evaluation of the average error probability of random coding concerning the phase basis, which is essential to the security of quantum key distribution [9]. Therefore, it is appropriate to treat the second-order asymptotics from the applied viewpoint as well as the theoretical viewpoint. On the other hand, Hayashi [5] treated the second-order asymptotics of fixed-length source coding and intrinsic randomness using the method of information spectrum, which was initiated by Han-Verdú [3] and was mainly formulated by Han [4]. 
Hayashi [5] discussed the error probability when the compressed size is H(P)n + a√n, where n is the size of the input system and H(P) is the entropy of the distribution P of the input system. In the method of information spectrum, we treat the general asymptotic formula, which gives the relationship between the asymptotic optimal performance and the normalized logarithm of the likelihood of the probability distribution. In order to treat a special case, we apply the general asymptotic formula to the respective information source and calculate the asymptotic stochastic behavior of the normalized logarithm of the likelihood. That is, in the information spectrum method, we have two steps: deriving the general asymptotic formula and applying the general asymptotic formula. With respect to fixed-length source coding and intrinsic randomness, the same relation holds concerning the general asymptotic formula in the second-order asymptotics. However, there is a difference concerning the application of the general asymptotic formula to independent and identical distributions. That is, while the normalized logarithm of the likelihood approaches the entropy H(P) in probability in the first-order asymptotics, its stochastic behavior is asymptotically described by the normal distribution in the second-order asymptotics. In other words, in the second step, the first-order asymptotics corresponds to the law of large numbers, and the second-order asymptotics corresponds to the central limit theorem. In the present paper, we treat channel coding in the second-order asymptotics, i.e., the case in which the transmission length is C_W n + a√n. Similar to the above-mentioned case, we employ the method of information spectrum. That is, we treat the general channel, which is a general sequence {W^n(y|x)} of conditional probability distributions without structure. 
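The correspondence between the two steps and the law of large numbers / central limit theorem can be checked numerically. The following is a minimal sketch (not from the paper), assuming an i.i.d. Bernoulli source P = (p, 1 − p); all variable names are illustrative. It samples the normalized logarithm of the likelihood −(1/n) log P^n(x) and compares its mean with H(P) and the variance of its √n-scaled fluctuation with the per-symbol variance of the self-information.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                              # Bernoulli source P = (p, 1 - p)
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))     # entropy H(P) in bits
# variance of the per-symbol self-information (governs the second-order term)
V = p * (-np.log2(p) - H) ** 2 + (1 - p) * (-np.log2(1 - p) - H) ** 2

n, trials = 4_000, 1_000
x = rng.random((trials, n)) < p                      # i.i.d. source outputs
info = np.where(x, -np.log2(p), -np.log2(1 - p))     # per-symbol self-information
Zn = info.mean(axis=1)                               # -(1/n) log P^n(x), one value per trial

# First order (law of large numbers): Zn concentrates at H(P).
# Second order (central limit theorem): sqrt(n) (Zn - H) is approximately N(0, V).
fluct = np.sqrt(n) * (Zn - H)
print(f"H(P) = {H:.4f} bits, sample mean of Zn = {Zn.mean():.4f}")
print(f"V = {V:.4f}, sample variance of sqrt(n)(Zn - H) = {fluct.var():.4f}")
```

The first-order term fixes only the location H(P); the Gaussian spread visible in `fluct` is exactly what the compressed size H(P)n + a√n trades against the error probability.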
As shown by Verdú-Han [2], this method enables us to characterize the asymptotic performance with only the random variable (1/n) log( W^n(y|x) / W^n_{P^n}(y) ) (the normalized logarithm of the likelihood ratio between the conditional distribution and the non-conditional distribution) without any further assumption, where W^n_{P^n}(y) := Σ_x P^n(x) W^n(y|x). Concerning this general asymptotic formula, if we can suitably formulate theorems in the second-order asymptotics and establish an appropriate relationship between the first-order asymptotics and the second-order asymptotics, we can easily extend proofs concerning the first-order asymptotics to those of the second-order asymptotics. Therefore, there is no serious difficulty in establishing the general asymptotic formula in the second-order asymptotics. In order to clarify this point, we present proofs of some relevant theorems in the first-order asymptotics, even though they are known.
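For a concrete memoryless channel, the second-order term is governed by the channel dispersion V, the variance of the information density under the capacity-achieving input. The following sketch (not from the paper; the BSC example, function names, and parameter values are illustrative assumptions) evaluates the resulting Gaussian approximation log2 M ≈ n C_W + √(nV) Φ^{-1}(ε) for a binary symmetric channel with crossover probability δ.

```python
import math
from statistics import NormalDist

def bsc_capacity_dispersion(delta):
    """Capacity C_W (bits/channel use) and dispersion V of a BSC(delta)."""
    h2 = -(delta * math.log2(delta) + (1 - delta) * math.log2(1 - delta))
    C = 1.0 - h2
    # variance of the information density under the uniform (capacity-achieving) input
    V = delta * (1 - delta) * math.log2((1 - delta) / delta) ** 2
    return C, V

def second_order_log_M(n, eps, delta):
    """Gaussian approximation of the maximal code size:
    log2 M(n, eps) ~ n C_W + sqrt(n V) Phi^{-1}(eps)."""
    C, V = bsc_capacity_dispersion(delta)
    return n * C + math.sqrt(n * V) * NormalDist().inv_cdf(eps)

n, eps, delta = 1000, 1e-3, 0.11
C, V = bsc_capacity_dispersion(delta)
logM = second_order_log_M(n, eps, delta)
print(f"C_W = {C:.4f} bits/use, V = {V:.4f}")
print(f"n = {n}, eps = {eps}: log2 M ~ {logM:.1f} bits "
      f"(first-order estimate n*C_W = {n * C:.1f})")
```

The gap between the first-order estimate n C_W and the second-order value, of order √n, is the constant-factor effect near capacity that the exponential (error-exponent) evaluation ignores.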


Similar articles

Fixed Error Asymptotics For Erasure and List Decoding

We derive the optimum second-order coding rates, known as second-order capacities, for erasure and list decoding. For erasure decoding over discrete memoryless channels, we show that the second-order capacity is √V Φ^{-1}(t), where V is the channel dispersion and t is the total error probability, i.e., the sum of the erasure and undetected-error probabilities. We show numerically that the expected rate at finite b...


On the Dispersions of the Gel'fand-Pinsker Channel and Dirty Paper Coding

This paper studies second-order coding rates for memoryless channels with a state sequence known non-causally at the encoder. In the case of finite alphabets, an achievability result is obtained using constant-composition random coding, and by using a small fraction of the block to transmit the type of the state sequence. For error probabilities less than 1/2, it is shown that the second-order...


Second-Order Asymptotics in Covert Communication

We study the first- and second-order asymptotics of covert communication with Pulse-Position Modulation (PPM) over binary-input Discrete Memoryless Channels (DMCs) for three different metrics of covertness. When covertness is measured in terms of the relative entropy between the channel output distributions induced with and without communication, we characterize the exact PPM second-order asympto...


Refined Asymptotics for Rate-Distortion using Gaussian Codebooks for Arbitrary Sources

The rate-distortion saddle-point problem considered by Lapidoth (1997) consists in finding the minimum rate to compress an arbitrary ergodic source when one is constrained to use a random Gaussian codebook and minimum (Euclidean) distance encoding is employed. We extend Lapidoth’s analysis in several directions in this paper. Firstly, we consider refined asymptotics. In particular, when the sou...


Non-Asymptotic Converse Bounds and Refined Asymptotics for Two Lossy Source Coding Problems

In this paper, we revisit two multi-terminal lossy source coding problems: the lossy source coding problem with side information available at the encoder and one of the two decoders, which we term as the Kaspi problem (Kaspi, 1994), and the multiple description coding problem with one semi-deterministic distortion measure, which we refer to as the Fu-Yeung problem (Fu and Yeung, 2002). For the ...



Journal:
  • CoRR

Volume: abs/0801.2242

Publication date: 2008